
MkLlm


Node for LLM-based text generation.

Example: Regular

Jinja

```jinja
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
```

Python

```python
MkLlm('Write a poem about MkDocs')
```

In the realm where knowledge meets the code,
A beacon shines, a digital road,
With MkDocs poised, sleek and refined,
Crafting documents, beautifully aligned.

From Markdown’s breath, ideas take flight,
A symphony of text, in structured delight,
With themes to adorn, like garments to wear,
Your project’s tale unfolds, with style and care.

Navigation flows like a river so wide,
Sections and pages in seamless glide,
Search boxes eager, to fetch what you seek,
Every query a whisper, every answer unique.

Versioned histories, like footprints in sand,
Marking the journey of thoughts through the land,
With each build a promise, a story reborn,
From dusk until dawn, innovation is worn.

GitHub and pals, a toolkit in hand,
Deploy with a push, it's all so well-planned,
Continuous whispers of collaboration’s glee,
In the heart of the code, we grow and we be.

So raise up your glasses, to documentation divine,
To MkDocs, the faithful, a lifeline in time,
For knowledge is power, and power we wield,
In the gardens of sharing, together we’ll build.


Bases: MkText

text property

text: str

__init__

```python
__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)
```

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `user_prompt` | `str` | Main prompt for the LLM | *required* |
| `system_prompt` | `str \| None` | System prompt to set LLM behavior | `None` |
| `model` | `str` | LLM model identifier to use | `'openai:gpt-4o-mini'` |
| `context` | `str \| None` | Main context string | `None` |
| `extra_files` | `Sequence[str \| PathLike[str]] \| None` | Additional context files or strings | `None` |
| `kwargs` | `Any` | Keyword arguments passed to parent | `{}` |
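The `extra_files` parameter accepts a mix of file paths, directory paths, and plain strings. How each entry is resolved (file: its text; directory: the text of every contained file; anything else: the literal string) can be illustrated with a small standalone sketch using only `pathlib` rather than the `UPath` the library relies on:

```python
import tempfile
from pathlib import Path


def resolve_extra_files(items: list[str]) -> list[str]:
    """Resolve each entry the way MkLlm treats extra_files:
    file -> its text, directory -> text of every contained file,
    anything else -> the literal string itself."""
    resolved: list[str] = []
    for item in items:
        path = Path(item)
        if path.is_file():
            resolved.append(path.read_text())
        elif path.is_dir():
            resolved.extend(f.read_text() for f in path.rglob("*") if f.is_file())
        else:
            resolved.append(str(item))
    return resolved


with tempfile.TemporaryDirectory() as tmp:
    notes = Path(tmp) / "notes.txt"
    notes.write_text("file context")
    items = resolve_extra_files([str(notes), "inline context string"])
    print(items)  # ['file context', 'inline context string']
```

This mirrors the `_process_extra_files` method shown in the source below; the real implementation additionally logs and raises a `ValueError` when a file cannot be read.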
| Name | Children | Inherits |
|------|----------|----------|
| `MkText` (`mknodes.basenodes.mktext`)<br>Class for any Markup text. | | |
```mermaid
graph TD
  94272695801712["mkllm.MkLlm"]
  94272697805536["mktext.MkText"]
  94272697820320["mknode.MkNode"]
  94272697777776["node.Node"]
  139836355973312["builtins.object"]
  94272697805536 --> 94272695801712
  94272697820320 --> 94272697805536
  94272697777776 --> 94272697820320
  139836355973312 --> 94272697777776
```
/home/runner/work/mknodes/mknodes/mknodes/templatenodes/mkllm/metadata.toml
```toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
```
mknodes.templatenodes.mkllm.MkLlm
```python
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )
```